Application of gradient descent method to the sedimentary grain-size distribution fitting

Authors
Abstract

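The abstract is not reproduced in this listing. Purely as an illustration of the technique named in the title, the sketch below fits a grain-size frequency curve by plain gradient descent, assuming a two-component Gaussian mixture on the phi scale, a least-squares objective, finite-difference gradients and synthetic data; none of these choices is taken from the paper.

```python
# Illustrative sketch only: gradient-descent fit of a bimodal grain-size
# distribution. Model, data and parameters are assumptions, not the paper's.
import numpy as np

def mixture(params, phi):
    """Two-component Gaussian mixture density on the phi (log2 grain size) scale."""
    w, mu1, s1, mu2, s2 = params
    g = lambda mu, s: np.exp(-0.5 * ((phi - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))
    return w * g(mu1, s1) + (1.0 - w) * g(mu2, s2)

def loss(params, phi, freq):
    """Least-squares misfit between the model density and the measured frequencies."""
    return np.sum((mixture(params, phi) - freq) ** 2)

def num_grad(f, x, eps=1e-6):
    """Central finite-difference gradient; keeps the sketch short and generic."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2.0 * eps)
    return g

# Synthetic "measured" distribution: bimodal, as many sediments are.
phi = np.linspace(-2.0, 10.0, 120)
true_params = np.array([0.6, 2.0, 0.8, 6.0, 1.2])
freq = mixture(true_params, phi) + np.random.default_rng(0).normal(0.0, 0.002, phi.size)

params = np.array([0.5, 1.0, 1.0, 5.0, 1.5])   # rough starting guess
lr = 0.05                                      # fixed step size
for _ in range(5000):
    params = params - lr * num_grad(lambda p: loss(p, phi, freq), params)

print("fitted parameters:", np.round(params, 3))
```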

Related articles

A Gradient-Descent Method for Curve Fitting on Riemannian Manifolds

Given data points p0, …, pN on a closed submanifold M of R^n and time instants 0 = t0 < t1 < … < tN = 1, we consider the problem of finding a curve γ on M that best approximates the data points at the given instants while being as “regular” as possible. Specifically, γ is expressed as the curve that minimizes the weighted sum of a sum-of-squares term penalizing the lack of fitting to t...
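
The sketch below is only a rough analogue of this fitting-plus-regularity idea: the curve is discretized as nodes on the unit sphere S^2 (standing in for M), a sum-of-squares fitting term plus a second-difference regularity penalty is minimized by gradient descent, and each node is retracted back onto the sphere after every step. The sphere, the discretization, the weight lam and the step size are all assumptions made here for illustration, not the algorithm of the paper.

```python
# Rough analogue only: discretized curve fitting on the sphere with a
# fitting term, a second-difference regularity penalty, and retraction.
import numpy as np

rng = np.random.default_rng(1)
K, lam = 8, 0.5                                  # number of curve nodes, regularity weight
P = rng.normal(size=(K, 3))
P /= np.linalg.norm(P, axis=1, keepdims=True)    # data points p_i on the unit sphere

def grad(X):
    """Euclidean gradient of sum ||x_i - p_i||^2 + lam * sum ||second differences||^2."""
    g = 2.0 * (X - P)
    d2 = X[:-2] - 2.0 * X[1:-1] + X[2:]          # discrete second differences
    g[:-2] += 2.0 * lam * d2
    g[1:-1] -= 4.0 * lam * d2
    g[2:] += 2.0 * lam * d2
    return g

X = P.copy()                                     # initialize the curve at the data
for _ in range(2000):
    X = X - 0.02 * grad(X)                       # gradient step in ambient R^3
    X /= np.linalg.norm(X, axis=1, keepdims=True)  # retract each node back onto the sphere

print("regularized curve nodes:\n", np.round(X, 3))
```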


Application of Gradient Steepest Descent Method to the Problem of Crystal Lattice Parametric Identification

The objective of this work is the development of a crystal lattice parameter identification algorithm that yields a more accurate solution than the Bravais unit cell estimation algorithm. To achieve this objective, we suggest solving the parameter identification problem using the steepest descent gradient method. The study of the parameter identification accuracy was cond...
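
As a generic illustration of steepest descent for parameter identification (not the crystal-lattice model of the paper), the sketch below minimizes a least-squares misfit between a placeholder forward model and synthetic observations, choosing each step along the negative gradient with a backtracking (Armijo) line search; model(), the data and theta are assumptions made here.

```python
# Generic steepest-descent parameter identification sketch (assumed model/data).
import numpy as np

def model(theta, x):
    a, b = theta
    return a * np.sin(b * x)                    # placeholder forward model

def J(theta, x, y):
    return 0.5 * np.sum((model(theta, x) - y) ** 2)

def grad_J(theta, x, y, eps=1e-6):
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (J(theta + e, x, y) - J(theta - e, x, y)) / (2.0 * eps)
    return g

x = np.linspace(0.0, 4.0, 50)
y = model(np.array([2.0, 1.5]), x)              # synthetic "observations"
theta = np.array([1.0, 1.0])                    # starting guess

for _ in range(200):
    g = grad_J(theta, x, y)
    step = 1.0
    # backtracking (Armijo) line search along the steepest-descent direction -g
    while J(theta - step * g, x, y) > J(theta, x, y) - 1e-4 * step * (g @ g) and step > 1e-12:
        step *= 0.5
    theta = theta - step * g

print("identified parameters:", np.round(theta, 4))
```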


Detecting Sedimentary Cycles using Autocorrelation of Grain size

Detection of sedimentary cycles is difficult in fine-grained or homogeneous sediments but is a prerequisite for the interpretation of depositional environments. Here we use a new autocorrelation analysis to detect cycles in a homogeneous sediment core, E602, from the northern shelf of the South China Sea. Autocorrelation coefficients were calculated for different mean grain sizes at various depth...
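
The general idea can be illustrated in a few lines: compute the autocorrelation of a mean grain-size depth series and read the dominant cycle length from the lag at which the autocorrelation peaks. The synthetic series and the lag window below are illustrative assumptions, not the analysis applied to core E602.

```python
# Sketch: cycle detection from the autocorrelation of a grain-size series.
import numpy as np

# Synthetic mean grain-size depth series (phi units) with a ~75-sample cycle.
depth = np.arange(600)
grain = 6.0 + 0.4 * np.sin(2.0 * np.pi * depth / 75.0) \
        + np.random.default_rng(2).normal(0.0, 0.2, depth.size)

x = grain - grain.mean()
acf = np.correlate(x, x, mode="full")[x.size - 1:]   # lags 0 .. N-1
acf /= acf[0]                                        # normalize so acf[0] == 1

# Dominant cycle length: lag of the largest autocorrelation inside a window
# that excludes very short lags (window bounds are an illustrative choice).
min_lag, max_lag = 10, depth.size // 2
lag = min_lag + int(np.argmax(acf[min_lag:max_lag]))
print("dominant cycle length (samples):", lag)       # ~75 for this series
```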


Cost-Sensitive Approach to Batch Size Adaptation for Gradient Descent

In this paper we propose a novel approach to automatically determine the batch size in stochastic gradient descent methods. The choice of the batch size induces a trade-off between the accuracy of the gradient estimate and the cost in terms of samples of each update. We propose to determine the batch size by optimizing the ratio between a lower bound to a linear or quadratic Taylor approximatio...
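
As a loose illustration of batch-size adaptation (using a simple gradient-variance heuristic, not the cost-sensitive Taylor-bound criterion the abstract refers to), the sketch below enlarges the mini-batch when the gradient estimate is too noisy relative to its magnitude and shrinks it when the estimate is cheaply accurate; the linear-regression task, the thresholds and the doubling/halving schedule are all assumptions made here.

```python
# Heuristic batch-size adaptation sketch (not the paper's criterion).
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(10_000, 5))                 # synthetic linear-regression task
w_true = rng.normal(size=5)
y = X @ w_true + rng.normal(0.0, 0.5, X.shape[0])

def per_sample_grads(w, idx):
    """Per-sample gradients of the squared error 0.5 * (x.w - y)^2."""
    r = X[idx] @ w - y[idx]
    return X[idx] * r[:, None]

w = np.zeros(5)
batch, lr, target_noise = 32, 0.01, 1.0
for _ in range(300):
    idx = rng.choice(X.shape[0], size=batch, replace=False)
    G = per_sample_grads(w, idx)
    g = G.mean(axis=0)
    # relative noise of the mini-batch gradient: tr(cov) / (batch * ||g||^2)
    noise = G.var(axis=0, ddof=1).sum() / (batch * (g @ g) + 1e-12)
    if noise > target_noise and batch < 2048:
        batch *= 2                               # too noisy -> use larger batches
    elif noise < target_noise / 4.0 and batch > 8:
        batch //= 2                              # cheaply accurate -> smaller batches
    w = w - lr * g

print("final batch size:", batch, "| parameter error:", round(float(np.linalg.norm(w - w_true)), 3))
```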


Learning to learn by gradient descent by gradient descent

The move from hand-designed features to learned features in machine learning has been wildly successful. In spite of this, optimization algorithms are still designed by hand. In this paper we show how the design of an optimization algorithm can be cast as a learning problem, allowing the algorithm to learn to exploit structure in the problems of interest in an automatic way. Our learned algorit...



Journal

Journal title: Journal of Computational and Applied Mathematics

Year: 2009

ISSN: 0377-0427

DOI: 10.1016/j.cam.2009.09.005